A Block Sparsity Approach to Multiple Dictionary Learning for Audio Modeling

Author

  • Gautham J. Mysore
Abstract

Dictionary learning algorithms for audio modeling typically learn a dictionary such that each time frame of the given sound source is approximately equal to a linear combination of the dictionary elements. Since audio is non-stationary data, learning a single dictionary to explain all time frames of the sound source might not be the best modeling strategy. We therefore recently proposed a technique to jointly learn multiple dictionaries such that each time frame of the given sound source is approximately equal to a linear combination of the dictionary elements from one of the many dictionaries. This is equivalent to modeling each time frame with a small subset of all of the dictionary elements in the model, which is analogous to block sparsity on the mixture weights over all dictionary elements. In this paper, we show why there is inherent block sparsity in our model due to its hierarchical nature and why this is useful for audio applications.
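As a rough illustration of the block-sparsity structure described above (this is not the algorithm from the paper, which learns the dictionaries jointly; here the dictionaries are assumed to be already learned, and the function names encode_frame and block_sparse_activations are hypothetical), the following NumPy sketch encodes each magnitude-spectrum frame against several dictionaries and keeps only the activations of the single best-fitting one, so the activation vector over all dictionary elements is non-zero in exactly one block.

# Minimal sketch of block-sparse coding with multiple pre-learned dictionaries.
# Illustrative only: each frame is explained by exactly one dictionary,
# chosen as the one with the lowest non-negative least-squares error.
import numpy as np
from scipy.optimize import nnls

def encode_frame(frame, dictionaries):
    """Pick the single dictionary that best reconstructs `frame`.

    frame        : (F,) non-negative magnitude spectrum
    dictionaries : list of (F, R) non-negative dictionary matrices
    Returns (best_index, activations, reconstruction_error).
    """
    best = None
    for k, D in enumerate(dictionaries):
        w, err = nnls(D, frame)            # non-negative mixture weights
        if best is None or err < best[2]:
            best = (k, w, err)
    return best

def block_sparse_activations(frame, dictionaries):
    """Activations over the concatenation of all dictionaries.

    Only the block belonging to the chosen dictionary is non-zero,
    which is the block-sparsity pattern described in the abstract.
    """
    k, w, _ = encode_frame(frame, dictionaries)
    blocks = [np.zeros(D.shape[1]) for D in dictionaries]
    blocks[k] = w
    return np.concatenate(blocks)

# Toy usage: 3 random dictionaries over a 64-bin spectrum, 10 frames.
rng = np.random.default_rng(0)
dicts = [np.abs(rng.standard_normal((64, 8))) for _ in range(3)]
frames = np.abs(rng.standard_normal((10, 64)))
H = np.stack([block_sparse_activations(f, dicts) for f in frames])
print(H.shape)   # (10, 24): each row is non-zero in only one 8-column block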


Related Articles

Speech Enhancement using Adaptive Data-Based Dictionary Learning

In this paper, a speech enhancement method based on sparse representation of data frames is presented. Speech enhancement is one of the most widely used techniques in signal processing. The objective of a speech enhancement system is to improve either the intelligibility or the quality of speech signals. This process is carried out using speech signal processing techniques ...


A Shift Tolerant Dictionary Training Method

Traditional dictionary learning methods work by vectorizing long signals and training on frames of the data, thereby restricting the learning to time-localized atoms. We study a shift-tolerant approach to learning dictionaries, whereby the features are learned by training on shifted versions of the signal of interest. We propose an optimized Subspace Clustering learning method to accommodat...
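As a rough sketch of the idea of training on shifted copies of the signal (illustrative only; the subspace-clustering formulation mentioned in this paper is not reproduced here), the helper below, with the hypothetical name shifted_frames and illustrative parameter values, builds a training matrix whose columns are windows of the signal extracted at several small offsets.

# Illustrative sketch: collect overlapping windows of the signal at several
# small shifts, so a dictionary learned from this matrix sees shifted
# versions of the same local structure.
import numpy as np

def shifted_frames(signal, frame_len, hop=1, max_shift=8):
    """Collect frames of `signal` at every offset up to `max_shift`.

    Hypothetical helper: frame_len, hop, and max_shift are illustrative
    parameters, not taken from the cited paper.
    """
    frames = []
    for shift in range(max_shift):
        x = signal[shift:]
        n = (len(x) - frame_len) // hop + 1
        frames.extend(x[i * hop : i * hop + frame_len] for i in range(n))
    return np.array(frames).T    # (frame_len, num_frames) training matrix

rng = np.random.default_rng(0)
X = shifted_frames(rng.standard_normal(4096), frame_len=256, hop=128)
print(X.shape)   # columns are shifted, overlapping windows of the signal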


Cluster-Based Dictionary Learning and Locality-Constrained Sparse Reconstruction for Trajectory Classification

Trajectory classification has been extensively investigated in recent years; however, the problems of automatically modeling unlabeled and incomplete trajectories are far from solved. In this paper, we propose a Cluster-based Dictionary Learning (CDL) approach that first constructs an initial cluster-based dictionary by K-means clustering and incrementally updates it by exploring the im...


Task-Driven Dictionary Learning for Hyperspectral Image Classification with Structured Sparsity Constraints

Sparse representation models a signal as a linear combination of a small number of dictionary atoms. As a generative model, it requires the dictionary to be highly redundant in order to ensure both a stable high sparsity level and a low reconstruction error for the signal. Howe...


Per-Block-Convex Data Modeling by Accelerated Stochastic Approximation

Applications involving dictionary learning, non-negative matrix factorization, subspace clustering, and parallel factor tensor decomposition motivate algorithms for per-block-convex and non-smooth optimization problems. By leveraging the stochastic approximation paradigm and first-order acceleration schemes, this paper develops an online and modular learning algorithm for a large cla...




Publication date: 2012